Ordered Weighted L1 Regularized Regression with Strongly Correlated Covariates: Theoretical Aspects
Abstract
This paper studies the ordered weighted ℓ1 (OWL) family of regularizers for sparse linear regression with strongly correlated covariates. We prove sufficient conditions for clustering correlated covariates, extending and qualitatively strengthening previous results for a particular member of the OWL family: OSCAR (octagonal shrinkage and clustering algorithm for regression). We derive error bounds for OWL with correlated Gaussian covariates: for cases in which clusters of covariates are strongly (even perfectly) correlated, but covariates in different clusters are uncorrelated, we show that if the true p-dimensional signal involves only s clusters, then O(s log p) samples suffice to accurately estimate it, regardless of the number of coefficients within the clusters. Since the estimation of s-sparse signals with completely independent covariates also requires O(s log p) measurements, this shows that by using OWL regularization, we pay no price (in the number of measurements) for the presence of strongly correlated covariates.
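To make the OWL penalty and its clustering behavior concrete, here is a minimal NumPy sketch (not the authors' implementation): it evaluates the OWL norm, computes its proximal operator via the pool-adjacent-violators algorithm, and runs plain proximal gradient descent with OSCAR weights on a toy design containing a perfectly correlated pair of columns. The function names and the choices of `lam1`, `lam2`, and step size are illustrative assumptions, not values from the paper.

```python
import numpy as np

def owl_norm(beta, w):
    """OWL penalty: nonincreasing weights w paired with sorted |beta|."""
    return float(np.sum(w * np.sort(np.abs(beta))[::-1]))

def prox_owl(v, w):
    """Proximal operator of the OWL norm (illustrative sketch):
    sort |v| descending, subtract the weights, project onto the
    nonincreasing nonnegative cone via pool-adjacent-violators,
    then undo the sort and restore the signs."""
    abs_v = np.abs(v)
    order = np.argsort(-abs_v)
    z = abs_v[order] - w
    sums, counts = [], []  # PAV blocks: running (sum, size)
    for zi in z:
        sums.append(zi)
        counts.append(1)
        # merge blocks while the nonincreasing constraint is violated
        while len(sums) > 1 and sums[-2] / counts[-2] <= sums[-1] / counts[-1]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    x = np.concatenate([np.full(c, max(s / c, 0.0))
                        for s, c in zip(sums, counts)])
    out = np.empty_like(np.asarray(v, dtype=float))
    out[order] = x
    return np.sign(v) * out

# Toy demo: two perfectly correlated columns carry the whole signal.
rng = np.random.default_rng(0)
n, p = 50, 6
X = rng.standard_normal((n, p))
X[:, 1] = X[:, 0]                                  # duplicated column
y = X @ np.array([2., 2., 0., 0., 0., 0.]) + 0.01 * rng.standard_normal(n)

lam1, lam2 = 0.1, 0.5                              # illustrative OSCAR parameters
w = lam1 + lam2 * np.arange(p - 1, -1, -1)         # w_i = lam1 + lam2 * (p - i)

L = np.linalg.norm(X, 2) ** 2                      # Lipschitz constant of the gradient
beta = np.zeros(p)
for _ in range(2000):                              # plain proximal gradient descent
    grad = X.T @ (X @ beta - y)
    beta = prox_owl(beta - grad / L, w / L)

print(beta[:2])  # the duplicated columns receive identical coefficients
```

On this toy problem the two duplicated columns end up with exactly tied coefficients, which is the clustering behavior the sufficient conditions in the paper guarantee; with all weights equal, `prox_owl` reduces to ordinary soft-thresholding.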
Similar resources
Identifying Groups of Strongly Correlated Variables through Smoothed Ordered Weighted L1-norms
The failure of LASSO to identify groups of correlated predictors in linear regression has sparked significant research interest. Recently, various norms [1, 2] were proposed, which can be best described as instances of ordered weighted ℓ1 norms (OWL) [3], as an alternative to ℓ1 regularization used in LASSO. OWL can identify groups of correlated variables but it forces the model to be constant ...
Efficient L1 Regularized Logistic Regression
L1 regularized logistic regression is now a workhorse of machine learning: it is widely used for many classification problems, particularly ones with many features. L1 regularized logistic regression requires solving a convex optimization problem. However, standard algorithms for solving convex optimization problems do not scale well enough to handle the large datasets encountered in many pract...
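The convex optimization problem mentioned in this abstract can be sketched in a few lines of NumPy. The following is a generic proximal-gradient (ISTA) solver, not the scalable algorithm that paper proposes; the data, `lam=0.05`, and iteration count are illustrative assumptions.

```python
import numpy as np

def fit_l1_logreg(X, y, lam, n_iter=1000):
    """L1-regularized logistic regression via proximal gradient (ISTA):
    a gradient step on the average logistic loss followed by
    soft-thresholding, which produces exact zeros in the solution."""
    n, p = X.shape
    # Lipschitz constant of the gradient of the average logistic loss
    L = 0.25 * np.linalg.norm(X, 2) ** 2 / n
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-(X @ beta)))     # predicted probabilities
        grad = X.T @ (mu - y) / n
        v = beta - grad / L
        beta = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)
    return beta

# Toy data: only the first two of 20 features matter.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 20))
logits = 3.0 * X[:, 0] - 3.0 * X[:, 1]
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

beta_hat = fit_l1_logreg(X, y, lam=0.05)
print(beta_hat)  # sparse: most irrelevant coefficients are exactly zero
```

ISTA illustrates the structure of the problem (smooth loss plus a nonsmooth ℓ1 term) but converges slowly on large datasets, which is precisely the scaling issue that abstract addresses.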
Feature Selection via Block-Regularized Regression
Identifying co-varying causal elements in very high dimensional feature space with internal structures, e.g., a space with as many as millions of linearly ordered features, as one typically encounters in problems such as whole genome association (WGA) mapping, remains an open problem in statistical learning. We propose a block-regularized regression model for sparse variable selection in a high...
Publication year: 2016